Patent abstract:
Method, system and program product for interaction in virtual reality environments using a desktop force feedback haptic device (502). The method (100) comprises: defining (102) an action space (210) in a virtual reality environment corresponding to a workspace (510) of the desktop force feedback haptic device (502); upon detection (104) of a modification order (512) for the action space (210), modifying (106) the size of the action space (210) as a function of the detected modification order (512), determining (107) the location of the modified action space (210') so as to keep the position of the virtual avatar (206) unaltered, and mapping (108) the workspace (510) of the desktop force feedback haptic device (502) to the modified action space (210'); and upon detection (110) of a repositioning order (514) for the action space (210), moving (112) the action space (210) to a new location in the virtual reality environment as a function of the detected repositioning order (514). (Machine translation by Google Translate, not legally binding)
Publication number: ES2807674A1
Application number: ES202031083
Filing date: 2020-10-29
Publication date: 2021-02-23
Inventors: Llamas Camino Fernández;Costales Gonzalo Esteban;Fernández Alexis Gutiérrez
Applicant: Universidad de Leon
IPC main classification:
Patent description:

[0004] FIELD OF THE INVENTION
[0005] The object of the invention falls within the field of computing, more specifically that of desktop force feedback haptic devices and their interaction with virtual reality.
[0007] BACKGROUND OF THE INVENTION
[0008] Desktop force feedback haptic devices are characterized by stimulating the user's proprioception, allowing the user to feel the force generated by an object when it is touched or manipulated. Several studies have used desktop force feedback haptic devices in conjunction with virtual reality headsets (HMDs, "Head-Mounted Displays"), generally for the training of specific skills or the rehabilitation of affected areas of the human anatomy. Their use in training exercises is exemplified in the study by Hashimoto et al. [1], in which haptic devices and HMDs are used together in a tooth-scaling simulator. In this simulator, the patient's mouth is visualized through the HMD and the stylus of the haptic device is matched with the dental tool used to remove tartar. The simulated image is constructed using augmented reality on position markers in the work environment and, since the positioning of the patient is important for the removal task, the position and rotation of the physical rig that houses the markers can be modified to reposition the denture.
[0010] In the study by Ji et al. [2], two large haptic devices are used in conjunction with a simulation of the bones and tissues of the back that allows users to palpate and explore them. Two forms of visualization are considered: one through an HMD, and the other through a hologram formed by a system of mirrors. The use of this type of device in rehabilitation exercises is exemplified in the study by Andaluz et al. [3], in which Oculus Rift virtual reality glasses are used for greater immersion, a Leap Motion tracking device for the selection of exercises, and a Novint Falcon haptic device for carrying out the exercises, which consist of moving objects, following paths in which different forces act, etc. In the study by Pruna et al. [4], a system designed specifically for children is developed in which rehabilitation exercises for the upper extremities are performed through games. The visualization is done through the Oculus Rift, and the games consist of, for example, watering plants or sorting objects into their corresponding boxes using the haptic device. Saad et al. [5] integrate the senses of sight and touch in a single interaction experience with anatomical models of the human body. For the visualization they used Oculus Rift virtual reality glasses, combined with interaction through the haptic device. Movement around the scene is carried out with auxiliary input devices manipulated with the non-dominant hand, that is, a keyboard, joystick or mouse.
[0012] In the documents mentioned above, desktop force feedback haptic devices are used in immersive virtual reality environments without the application of an interaction technique or model. The absence of an interaction technique means that the area that can be manipulated within the virtual scene is limited to the size of the physical workspace of the haptic device, considerably reducing the user's ability to interact with the entire scene in which they are immersed.
[0014] Patent ES2716012-B2 considers the interaction of a desktop force feedback haptic device in a virtual reality environment, where the haptic device sends a zoom command to enlarge or reduce the represented virtual scene. Said invention is based on maintaining at all times a relationship between the virtual size of the haptic device's workspace and the magnification level applied to the virtual scene, thus allowing the user to interact at any time with any element viewed in the scene, regardless of the magnification level applied. In this way, by increasing the magnification level of the scene, the user not only gets a visual zoom that allows them to see objects of interest more clearly, but also a haptic zoom, thanks to which their movements become more precise in the area of interest.
[0016] Given a virtual scene visualized through a virtual reality headset or viewer, the proposed invention addresses the integration into said scene of desktop force feedback haptic devices, making it possible to interact with any element of the scene with the desired level of precision. Thus, the present invention addresses the problem of the limited workspace available to these types of devices.
[0018] BIBLIOGRAPHY
[0019] [1] Hashimoto, N., Kato, H., & Matsui, K. (2007, November). Training of Tooth Scaling by Simulator - Development of Simulator and Investigation of its Effectiveness. In 17th International Conference on Artificial Reality and Telexistence (ICAT 2007) (pp. 251-257). IEEE.
[0021] [2] Ji, W., Williams, R. L., Howell, J. N., & Conatser, R. R. (2006, March). 3D stereo viewing evaluation for the virtual haptic back project. In 2006 14th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (pp. 251-258). IEEE.
[0023] [3] Andaluz, V. H., Salazar, P. J., Escudero, M., Bustamante, C., Silva, M., Quevedo, W., ... & Rivas, D. (2016, December). Virtual reality integration with force feedback in upper limb rehabilitation. In International Symposium on Visual Computing (pp. 259-268). Springer, Cham.
[0025] [4] Pruna, E., Acurio, A., Tigse, J., Escobar, I., Pilatásig, M., & Pilatásig, P. (2017, June). Virtual system for upper limbs rehabilitation in children. In International Conference on Augmented Reality, Virtual Reality and Computer Graphics (pp. 107-118). Springer, Cham.
[0027] [5] Saad, E., Funnell, W. R. J., Kry, P. G., & Ventura, N. M. (2018, October). A Virtual-Reality System for Interacting with Three-Dimensional Models Using a Haptic Device and a Head-Mounted Display. In 2018 IEEE Life Sciences Conference (LSC) (pp. 191-194). IEEE.
[0029] DESCRIPTION OF THE INVENTION
[0030] The present invention consists of a new interaction model applicable to desktop force feedback haptic devices that allows its use in immersive virtual reality environments.
[0032] A first aspect of the present invention refers to a method of interaction in virtual reality environments by means of a desktop force feedback haptic device, wherein a virtual avatar of the desktop force feedback haptic device is rendered in virtual scenes by a virtual reality viewer worn by a user.
[0034] The method comprises defining an action space in a virtual reality environment, where said action space corresponds to the workspace of the desktop force feedback haptic device; when an action space modification order is detected, the size of the action space is modified, increasing or decreasing its dimensions by a certain amount depending on the detected modification order, and the workspace of the desktop force feedback haptic device is mapped to the modified action space; and when an action space relocation order is detected, the action space is moved to a new location in the virtual reality environment based on the detected relocation order.
[0036] The invention solves the problem of limited workspace inherent in haptic desktop force feedback devices. The invention allows these haptic devices to be used for a natural and precise interaction with any element present in an immersive virtual reality scene. The interaction model is based on two modes of operation: interaction mode and relocation mode.
[0038] In the first place, the interaction mode bases its operation on the definition of a correspondence between the workspace of the haptic device in the real world and a specific space within the virtual scene called the 'action space' (or action area). The action space remains immobile within the virtual scene while the interaction mode is activated, which allows the user to interact through the haptic device with the elements that are within the action space, as well as change their point of view of the scene by moving their head, which is tracked by the virtual reality viewer.
[0040] While the interaction mode is active, the workspace of the haptic device in the virtual scene (i.e., the action space) does not change position, but the size of the action space may be reduced or enlarged to obtain greater precision or greater freedom of movement, respectively. Therefore, within the interaction mode, the user can change the size of the action space, thus achieving the level of precision or freedom of movement desired at all times. Resizing of the action space is applied (i) without altering the relative position of the virtual avatar of the haptic device (which represents the end effector of the desktop force feedback haptic device in the virtual scene) within the action space before and after the modification of the size of the action space, and (ii) without altering the absolute position of the virtual avatar in the virtual reality environment. In this way, the avatar maintains its relative position within the virtual workspace and the physical position of the effector of the haptic device is not modified by the change in size. Neither the virtual avatar (effector in the virtual world) nor the end effector (effector in the real world) of the desktop force feedback haptic device changes position. By increasing or decreasing the size of the action space, the virtual avatar always stays in the same position, which provides great naturalness in the interaction with the user. When the virtual avatar is located in the center of the action space at the time of resizing, only the size of the action space changes, without altering the position of the center of the modified action space.
However, when the virtual avatar is not centered in the action space when its size is modified, the necessary operations are carried out so that, after expanding or reducing the action space, the virtual avatar remains in exactly the same absolute position in the virtual reality environment and in the same relative position (with respect to the action space) in which it was before. This requires changing the center of the modified action space (i.e., moving the action space).
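The center adjustment described above can be expressed compactly: the new center is chosen so that the avatar's offset from the center scales with the space while the avatar's absolute position stays fixed. The following Python sketch is purely illustrative and not part of the claimed embodiments; the function and variable names are assumptions, not taken from the patent.

```python
def resize_action_space(center, size, avatar_pos, scale):
    """Resize the action space by `scale`, keeping the virtual avatar at the
    same absolute position and the same relative position within the space.

    center, avatar_pos: (x, y, z) coordinates in the virtual scene.
    size: (width, length, height) of the action space.
    scale: e.g. 2.0 doubles the dimensions, 0.5 halves them.
    """
    new_size = tuple(s * scale for s in size)
    # Offset of the avatar from the current center of the action space.
    offset = tuple(a - c for a, c in zip(avatar_pos, center))
    # The relative offset must scale with the space, so the new center is
    # placed so that avatar = new_center + scale * offset.
    new_center = tuple(a - scale * o for a, o in zip(avatar_pos, offset))
    return new_center, new_size
```

With an avatar offset (-a, 0, b) from the center and scale 2, this yields the (-2a, 0, 2b) relative displacement described for Figure 2B; when the avatar sits at the center, the center does not move.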
[0042] Due to the correspondence between the dimensions of the workspace of the haptic device (real world) and the dimensions of the action space (virtual world), the smaller the action space, the higher the level of precision the user obtains, while the larger the action space, the more freedom of movement they get.
[0044] For its part, the repositioning mode allows repositioning the workspace of the haptic device at any location within the virtual scene. In other words, the relocation mode allows the user to move the action space to the desired location in the virtual scene. In this way, by relocating the action space, the user can interact with any object or element of the virtual scene.
[0046] The relocation mode can be activated, for example, by means of an interaction (eg a double press) with any of the buttons of the haptic device or by looking at a certain area of the virtual scene and, once activated, the action space moves following the user's gaze. The new position of the action space within the scene is confirmed when the user exits the relocation mode, that is, when they return to the interaction mode. As long as the relocation mode is active, the user can select, in addition to the angular position of the action space with respect to the user, the distance of the action space to the user.
[0048] The invention contemplates different ways of making the change between operating modes (interaction and relocation):
[0050] - Buttons on the haptic device. If the haptic device used has buttons, one of them can be used as a mechanism for switching between operating modes. Thus, within the interaction mode, when the user presses the button, the change to relocation mode is triggered. Once the action space is in the desired location within the relocation mode, the user can fix that location and return to the interaction mode by pressing that button again. An alternative to pressing the button to switch between modes is to activate the repositioning mode only while the button on the haptic device is held down, so the user must select the new location of the action space while holding the button, fixing it in its new location when the button is released. Another alternative is a double press on one of the buttons of the haptic device to trigger the change between operating modes.
[0052] - Voice commands. The change between the different operating modes can also be done by specific voice commands that ad hoc software is able to recognize, process and act accordingly.
[0054] - Gaze fixation. This consists of placing a mode-change area within the virtual scene; if the user fixes their gaze on it for a few seconds (gaze focus being detected from the orientation of the virtual reality viewer, or by eye-tracking sensors installed in the viewer if it incorporates this type of sensor), the change between the interaction mode and the relocation mode is made. Once in relocation mode, if the user fixes their gaze for a few seconds on the desired location for the action space, it will remain fixed at that location and the system returns to interaction mode. As an aid to the user, the virtual scene can show loading bars that fill according to the time the user has spent looking at the mode-change area or at the desired location for the action space.
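The gaze-fixation mechanism above amounts to a dwell timer whose progress can drive the loading bar. The sketch below is purely illustrative and not part of the claimed embodiments; the class name, the per-frame update interface and the default dwell time are assumptions.

```python
class DwellSwitch:
    """Fires a mode change after the gaze stays on a target area for
    `dwell_time` seconds; `progress` (0..1) can drive a loading bar."""

    def __init__(self, dwell_time=2.0):
        self.dwell_time = dwell_time
        self.elapsed = 0.0

    def update(self, gaze_on_target, dt):
        """Call once per frame with the frame time dt (seconds).
        Returns True on the frame the switch fires."""
        if not gaze_on_target:
            self.elapsed = 0.0       # user looked away: reset the timer
            return False
        self.elapsed += dt
        if self.elapsed >= self.dwell_time:
            self.elapsed = 0.0       # fire once, then re-arm
            return True
        return False

    @property
    def progress(self):
        """Fraction of the dwell time completed, for the loading bar."""
        return min(self.elapsed / self.dwell_time, 1.0)
```

The same object can serve both transitions: one instance watches the mode-change area while in interaction mode, another watches the candidate location while in relocation mode.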
[0056] The invention also considers various ways to perform the possible actions contemplated to activate the modification of the size of the action space in the interaction mode and the repositioning of the action space in the relocation mode.
[0058] Modifying the size of the action space can be done in several different ways, for example:
[0060] - Buttons on the haptic device. If the haptic device has buttons, these can be used to change the size of the action space within the interaction mode. Stepped modifications (one per button press) or continuous modifications (as long as the button is held down) are contemplated. If the device has more than one button available, one button can be used to increase the size of the action space and another to decrease it. Conversely, if the haptic device has only one button, the enlargement can be performed with a single press of the button, and the opposite effect with a double press.
[0062] - Voice commands. The use of voice commands to modify the size of the action space is also contemplated. As with the change between operating modes, the software used is capable of recognizing and processing the command and acting accordingly, increasing or decreasing the size of the action space.
[0064] Modifying the distance from the action space to the user can also be done in multiple ways, including the following:
[0066] - Buttons on the haptic device. If the haptic device has buttons, these can be used to change the distance from the action space to the user within the relocation mode. Stepped distance modifications (one per button press) or continuous modifications of the distance (as long as the button is held down) are contemplated. If the desktop force feedback haptic device has more than one button, one of them can be used to increase the distance and another to decrease it. Conversely, if the haptic device has a single button, the distance can be increased with a single press of the button and reduced with a double press.
[0067] - Voice commands. The use of voice commands to modify the distance from the action space to the user is also contemplated. The software is capable of recognizing the specific command, processing it and acting accordingly to increase or decrease said distance.
[0069] A second aspect of the present invention refers to an interaction system in virtual reality environments using a haptic desktop force feedback device, which implements the method described above. The system comprises a desktop force feedback haptic device for interacting with a virtual reality environment; a graphic processing unit in charge of generating virtual scenes of the virtual reality environment in which a virtual avatar of the haptic desktop force feedback device is represented; and a virtual reality viewer to show a user the generated virtual scenes. The graphics processing unit is configured to implement the described method.
[0071] BRIEF DESCRIPTION OF THE DRAWINGS
[0072] Next, a series of figures are described very briefly that help to better understand the invention and that are expressly related to an embodiment of said invention that is presented as a non-limiting example thereof.
[0074] Figure 1 illustrates the different stages of an interaction method in virtual reality environments by means of a haptic desktop force feedback device, according to an embodiment of the present invention.
[0076] Figures 2A and 2B show the schematic operation of the proposed method in the interaction mode, and in particular during the modification of the size of the action space.
[0078] Figure 3 represents the operation of the proposed method in the repositioning mode, and in particular during the modification of the angular position of the action space.
[0080] Figure 4 represents the schematic operation of the relocation mode when the user modifies the distance at which the action space is located.
[0082] Figure 5 shows, according to a possible embodiment, the interaction system in virtual reality environments using a desktop force feedback haptic device, when the user sends an order to modify the size of the action space.
[0084] Figures 6A and 6B illustrate, respectively, the content of a modification order and a relocation order according to one embodiment.
[0086] PREFERRED EMBODIMENT OF THE INVENTION
[0087] The present invention refers to a method of interaction in virtual reality environments by means of a desktop force feedback haptic device, where a virtual avatar of the desktop force feedback haptic device is represented in virtual scenes by means of a virtual reality viewer worn by a user.
[0089] In Figure 1 the different steps of the method 100 are represented, according to one embodiment. First, an action space is defined 102 in a virtual reality environment, where said action space corresponds to the workspace of a desktop force feedback haptic device. Once the virtual reality environment has been defined, the method continuously checks, on the one hand, for the detection 104 of an order to modify the action space and, on the other hand, for the detection 110 of an order to relocate the action space. Such checks can be performed, for example, in parallel or one after the other, in a loop. The detected orders are generated from a user action, for example by pressing a button on the haptic device, by means of a voice command, or by operating the virtual reality viewer.
[0091] When an action space modification order is detected 104, the size of the action space is modified 106, increasing or decreasing the dimensions of the action space by a certain amount as a function of the detected modification order. Next, the location of the modified action space is determined 107 (for example, the location of its center) so that neither the absolute position of the virtual avatar in the virtual reality environment nor the relative position of the virtual avatar with respect to the action space before and after the resizing is altered. Finally, the workspace of the desktop force feedback haptic device is mapped 108 to the modified action space. These last two steps (107, 108) can be performed in any order.
[0092] When an action space relocation order is detected 110, the action space is moved 112 to a new location in the virtual reality environment based on the detected relocation order. In this case, whenever the action space moves a certain distance in a certain direction, the virtual avatar moves the same distance and in the same direction, since the remapping of both spaces (virtual and physical) must be consistent (and therefore the relative position of the virtual avatar within the action space must remain unchanged).
[0094] Figure 2A illustrates the execution of an action space modification order. In the particular case shown in the figure, it is an order to enlarge the size of the action space. The figure is divided into two virtual scenes 202, the upper one being the original scene and the lower one the resulting scene after applying a modification in the size of the action space.
[0096] In the upper part of Figure 2A, a virtual scene 202, represented in a virtual reality viewer 204 of a user 212, shows the virtual avatar 206 of a desktop force feedback haptic device, through which the user interacts with virtual objects 208 of the virtual reality environment also represented in the virtual scene 202. The figure also shows the action space 210 (which is not necessarily shown in the virtual scene 202), which corresponds to the volume within which the virtual avatar 206 can move in response to movement of the end effector of the desktop force feedback haptic device generated by the hand of the user 212. In Figure 2A, the virtual avatar 206 is represented by a small sphere in the center of the action space 210.
[0098] The lower part of the figure shows the enlargement of the dimensions of the action space 210, the modified action space 210' keeping the original proportions between the width, length and height of the action space 210. In the example of Figure 2A the virtual avatar 206 is centered in the action space 210, whereby the center of the action space 210 does not vary its position within the virtual scene 202, since the absolute position of the virtual avatar in the virtual reality environment does not vary and the relative position of the virtual avatar with respect to the action space is maintained before and after the resizing. As can be seen in Figure 2A, the virtual scene 202 represented in the virtual reality viewer 204 does not vary; only the size of the action space 210 is enlarged, which affects the mapping between the dimensions of the workspace of the desktop force feedback haptic device and the dimensions of the modified action space (that is, how much the virtual avatar 206 moves on each axis in the virtual scene of the virtual reality environment when the end effector of the desktop force feedback haptic device shifts a certain amount on each axis in the real world). As noted above, the display of the action space 210 in the virtual scene 202 is optional.
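The mapping just described can be sketched as a per-axis linear correspondence between the physical workspace and the action space. The following Python fragment is purely illustrative and not part of the claimed embodiments; the names and the convention of measuring the effector relative to the workspace center are assumptions.

```python
def map_to_action_space(effector_pos, workspace_size, action_center, action_size):
    """Map the physical end-effector position (relative to the workspace
    center, in device units) to the virtual avatar position in the scene.

    Each axis is scaled by action_size / workspace_size, so enlarging the
    action space makes the same physical motion cover more virtual distance
    (more freedom of movement), and shrinking it gives more precision.
    """
    return tuple(
        c + p * (a / w)
        for p, w, c, a in zip(effector_pos, workspace_size, action_center, action_size)
    )
```

For example, with a 0.2 m workspace mapped to a 0.4-unit action space, a 0.1 m physical displacement moves the avatar 0.2 virtual units along that axis.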
[0100] The modification of the size of the action space 210 is therefore carried out without altering the absolute position of the virtual avatar 206 or its relative position within the action space 210. In the example of Figure 2A, the virtual avatar 206 remains positioned in the center of the action space 210 before and after the modification of the size of the action space 210, so the center of the action space 210 does not vary its location. However, the position of the center of the action space 210 does vary in the event that the virtual avatar 206 is not positioned in the center of the action space 210 at the time of expanding its size, since what must remain motionless is the virtual avatar 206.
[0102] Figure 2B shows an enlargement of the action space 210 where the virtual avatar 206 is not positioned in the center 214 of the action space 210. In the upper part of Figure 2B, a virtual scene 202 represents a virtual object 208, the action space 210 and the virtual avatar 206 of the desktop force feedback haptic device. The virtual avatar 206 is located at an upper end of the action space, displaced a distance "-a" on the X axis, a distance of zero on the Y axis, and a distance "b" on the Z axis with respect to the center 214 of the action space 210.
[0104] In the lower part of Figure 2B, the virtual scene 202 above is shown with the virtual object 208 and the virtual avatar 206 in the same positions, and with the modified action space 210', specifically enlarged to double its size in response to a corresponding action space modification order. In this case, the center 214 of the action space 210 is moved a distance d to a new position, the modified center 214', to keep the relative position of the virtual avatar 206 unchanged before the enlargement (i.e., with respect to the action space 210) and after the enlargement (i.e., with respect to the modified action space 210'). By doubling the size of the action space 210, the original relative distances must also be doubled, so that the virtual avatar 206 is at a distance "-2a" on the X axis, a distance of zero on the Y axis, and a distance "2b" on the Z axis with respect to the modified center 214'. As previously indicated, in step 107 of the flow chart of Figure 1, the location of the modified action space 210' and, therefore, the location of the modified center 214' is determined so as to keep the position of the virtual avatar 206 unaltered (its absolute position in the virtual scene and its relative position with respect to the action spaces 210 and 210').
[0106] Returning to Figure 1, the method also checks whether the detection 110 of an action space repositioning order occurs, in which case the action space 210 is moved (112, 114), together with the virtual avatar 206, by a determined displacement vector to a new location. The relocation order contains enough information to determine the position of the new location of the action space 210, for example the three Cartesian coordinates (X, Y, Z) that determine the absolute position of a point of the action space 210 (e.g., its center), or the modification of the position (ΔX, ΔY, ΔZ) of the action space 210 along any of the Cartesian axes or in any other coordinate system. According to one embodiment, said order may comprise a modification of the angular position of the action space with respect to the user, a modification of the distance of the action space with respect to the user, or a combination of both.
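The repositioning step above amounts to applying the same displacement vector to the action space and to the avatar, so that the avatar's relative position within the space is preserved. The sketch below is purely illustrative and not part of the claimed embodiments; the names are assumptions.

```python
def relocate_action_space(action_center, avatar_pos, new_center):
    """Move the action space to `new_center`.

    The avatar is shifted by the same displacement vector, so its relative
    position within the action space (and hence the physical end-effector
    position it corresponds to) is unchanged.
    """
    displacement = tuple(n - c for n, c in zip(new_center, action_center))
    new_avatar = tuple(a + d for a, d in zip(avatar_pos, displacement))
    return new_center, new_avatar
```

The same function covers both an absolute target (X, Y, Z) and a delta order (ΔX, ΔY, ΔZ), since a delta can first be added to the current center to obtain `new_center`.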
[0108] Figure 3 represents the execution of an order to relocate the action space; in particular, the modification of the angular position of the action space 210 with respect to the user 212. As in the previous figure, Figure 3 is subdivided into two virtual scenes 202. The upper scene represents the instant in which the relocation mode is activated, since the virtual object 208 with which the user wants to interact is far from the action space 210. The user can then change their point of view, focusing their gaze on the virtual object 208 with which they want to start interacting, and it is at that angular position that the action space is placed, as represented in the lower scene, keeping the distance from the user constant (within the virtual reality environment).
[0110] The user 212 wearing the virtual reality viewer 204 can be viewed as a viewer positioned within the virtual reality environment in a position that would correspond to an intermediate position of their eyes. According to a preferred embodiment, the angular displacement (or modification of the angular position) of the action space 210 is determined based on a final orientation 308 of the virtual reality viewer 204. The orientation of the virtual reality viewer 204 can be defined for example as the line perpendicular to the segment 304 that joins the centers of the lenses of the virtual reality viewer 204 at its midpoint 306.
[0112] In Figure 3, the virtual reality viewer 204 is represented in the upper part with an initial orientation 302 centered on the action space 210. The user 212 initiates a repositioning order and rotates the virtual reality viewer 204 a certain angle to the left, until the gaze is focused on the virtual object 208. The final orientation 308 of the virtual reality viewer 204, centered on the virtual object 208, is included in the order to reposition the action space, and determines the new angular position of the action space 210 relative to the user 212. The angular position of the action space 210 can be defined, for example, relative to its geometric center. In the virtual scene 202 shown in the lower part of the figure, the repositioned action space 210'' is represented, once its geometric center has moved to the new angular position (centered on the virtual object 208) determined by the final orientation 308 of the viewer, keeping the distance to the user 212 constant. For simplicity, the figure shows only a two-dimensional lateral rotation, corresponding to the azimuth angle φ in spherical coordinates; but the final orientation 308 of the virtual reality viewer 204 would preferably be determined in three-dimensional space, for example by the azimuth angle φ and the polar angle θ when spherical coordinates are used.
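The conversion from the viewer's final orientation to the angular position of the action space can be sketched as follows. This Python fragment is purely illustrative and not part of the claimed embodiments; representing the orientation as a forward vector, and the physics convention of measuring the polar angle from the vertical (+Z) axis, are assumptions.

```python
import math

def orientation_to_angles(forward):
    """Convert the viewer's final orientation, given as a forward direction
    vector (x, y, z), into the azimuth and polar angles (radians) that fix
    the new angular position of the action space.

    Physics convention: azimuth measured in the XY plane from +X, polar
    angle measured from the +Z (up) axis.
    """
    x, y, z = forward
    azimuth = math.atan2(y, x)
    polar = math.acos(z / math.hypot(x, y, z))
    return azimuth, polar
```

A lateral head rotation, as in Figure 3, changes only the azimuth; tilting the head up or down changes the polar angle.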
[0114] According to one embodiment, detection 110 of a repositioning order of the action space 210 is initiated by a repositioning start command and is terminated by a repositioning end command generated when the virtual reality viewer 204 is oriented in the final orientation 308.
[0116] In one embodiment, the repositioning start command is generated by orienting the virtual reality viewer 204 for a specific time (e.g., 3 seconds) towards a specific area of the virtual environment (corresponding to the fixation of the user's gaze on that area), such as an area of the virtual environment (relocation area) in which a relocation command is displayed; and the repositioning end command is generated by maintaining the orientation of the virtual reality viewer 204 for a specified time (e.g., 3 seconds) in the final orientation 308, which determines the final angular position of the action space 210. Advantageously, this embodiment allows interaction with the user using only the virtual reality viewer, without the need for any buttons on the desktop force feedback haptic device, freeing its buttons for other functionality.
[0118] In another embodiment, the repositioning start command is generated by a first interaction with at least one button of the desktop force feedback haptic device, and the repositioning end command is generated by a second interaction with at least one button of the desktop force feedback haptic device when the virtual reality viewer 204 is oriented in the final orientation 308. The first and second interactions may consist, for example, of a single press or a double press of one of the buttons of the desktop force feedback haptic device. In one embodiment, the repositioning start command is generated through said first interaction when the virtual reality viewer 204 has a certain orientation, for example when it is oriented towards the action space 210.
[0120] Alternatively, the relocation start command and relocation end command can be generated using voice commands.
[0122] The repositioning order may comprise, alternatively or in addition to the foregoing, a modification of the distance of the action space 210 with respect to the user 212. The distance of the action space 210 to the user 212 can be taken, for example, as the distance from a reference point of the action space 210 (e.g. the geometric center 214) to the midpoint 306 between the lenses of the viewer (more specifically, to the representation of said midpoint in the virtual reality environment).
[0124] Figure 4 illustrates the schematic operation of the repositioning mode when the user modifies the distance at which the action space is located (with respect to the position of the user's eyes in the virtual world), thus allowing the user to reach objects located at a greater distance (d2) or a lesser distance (d1) than the current distance (d0) of the action space.
[0126] In Figure 4 the action space 210 is represented in its initial position, at a distance d0 from the user 212 (measured in the figure from the point of the action space 210 closest to the user 212, although it could be measured from any other point, for example the geometric center of the action space 210), and the repositioned action space 210'' is represented in two different locations, at a distance d1 and a distance d2 from the user, respectively. In the first case, distance d1, the action space 210 is brought closer to the user 212; in the second case, distance d2, the action space is moved away from the user 212. The repositioned action space 210'' can be shown in the new virtual scene 202, for example by means of a shaded space.
[0128] In one embodiment, the repositioning order comprises a combination of modifying the angular position of the action space relative to the user (Figure 3) and modifying the distance of the action space relative to the user (Figure 4). In this case, if the position of the geometric center of the action space 210 is considered in spherical coordinates, the angular position (azimuth angle φ and polar angle θ) of the repositioned action space 210'' is determined by a final orientation 308 of the virtual reality viewer 204, and the distance (radial coordinate ρ) of the repositioned action space 210'' to the user 212 is determined based on the initial distance d0 modified by an instruction or command of the user (final distance d1 or d2, depending on whether it is a zoom-in or zoom-out instruction).
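The combined order can be expressed compactly in spherical coordinates. The following Python sketch (illustrative only; the function name, argument shapes and values are not from the patent) updates the angles from the viewer's final orientation and the radius from a zoom factor:

```python
def reposition(center_spherical, final_orientation=None, zoom=None):
    """Apply a combined repositioning order to the action-space center,
    given in spherical coordinates (rho, phi, theta): the angular
    position comes from the viewer's final orientation, and the radial
    distance is scaled by a zoom factor."""
    rho, phi, theta = center_spherical
    if final_orientation is not None:
        phi, theta = final_orientation   # azimuth and polar angles
    if zoom is not None:
        rho *= zoom                      # >1 moves away (d2), <1 brings closer (d1)
    return (rho, phi, theta)

# d0 = 2.0; gaze settles at (phi=0.8, theta=1.2) and the user zooms out 1.5x.
moved = reposition((2.0, 0.0, 1.2), final_orientation=(0.8, 1.2), zoom=1.5)
```

Either part of the order may be absent, in which case the corresponding coordinate is left unchanged, matching the "alternatively or in addition" wording above.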
[0130] The ways in which the user can issue modification and repositioning orders can be very diverse. Thus, for example, the modification of the size of the action space and the modification of the distance of the action space can be determined by pressing one or more buttons of the desktop force feedback haptic device.
[0132] According to one embodiment, a double press of one of the buttons of the haptic device (mode switch button) can be used as a mechanism for switching between the two operating modes, interaction and repositioning. While in interaction mode, the size of the action space is modified continuously by using two buttons of the haptic device: an enlargement button to enlarge the action space and a reduction button to reduce its size. These same buttons, enlargement button and reduction button, can also be used as a mechanism for the continuous modification of the distance of the action space to the user within the repositioning mode. The enlargement/reduction speed in both cases (size or distance) is predetermined by software, being a parameter configurable by the user.
[0133] Given a 360-degree virtual scene, the corresponding action space is initially placed just in front of the user's base starting position, sized according to the virtual environment and in a format that allows a direct correlation with the physical workspace of the desktop force feedback haptic device. Thus, if the physical workspace of the haptic device has specific dimensions L x W x D (length, width and depth), the action space present in the scene will maintain the proportions defined by said constants (L, W, D).
[0135] In interaction mode, the user can progressively increase the size of the action space by holding down one of the buttons of the haptic device (the enlargement button). In this way, a fluid expansion is achieved that the user can adapt to their preferences. The reduction of the action space is likewise carried out by a long press of another of the buttons (the reduction button). Both the increase and the decrease in the size of the action space have logical limits, based on the size of the objects present in the virtual scene.
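The hold-to-resize behavior with logical limits amounts to a per-frame update clamped to a range. A minimal Python sketch (the rate, frame time and limit values are invented for illustration; the patent leaves them as configurable parameters):

```python
def update_scale(scale, button_held, rate=0.5, dt=1.0 / 90,
                 min_scale=0.25, max_scale=8.0):
    """One frame of continuous resizing: grow while the enlargement
    button is held, shrink while the reduction button is held, and
    clamp the result to the logical limits derived from the scene."""
    if button_held == "enlarge":
        scale += rate * dt
    elif button_held == "reduce":
        scale -= rate * dt
    return max(min_scale, min(scale, max_scale))
```

Calling this once per rendered frame while a button is held yields the fluid, user-adaptable expansion described above; releasing the button simply stops the updates.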
[0137] As long as the user is facing the action space 210, they can double-press one of the buttons of the haptic device (the mode switch button) to switch to the repositioning mode. The mode switch button may correspond to one of the buttons used to enlarge/reduce the size or distance. Once in repositioning mode, the user can then move their gaze to the angular position of the virtual environment where they want to place the action space (final orientation 308 of the virtual reality viewer 204). The user can also use two buttons of the haptic device (enlargement button and reduction button) within the repositioning mode to modify the distance from the user to the action space. When the user double-presses the mode switch button again, they return to interaction mode, with the action space fixed at its new location.
[0139] The present invention solves a problem inherent in desktop force feedback haptic devices: their small workspace. The proposed invention allows interaction by means of the haptic device in any part of the virtual scene, thanks to the repositioning of the action space, and with a desired level of precision, obtained by modifying the size of the action space.
[0140] The application of the present invention in haptic devices allows their use in a natural and precise way as main interaction devices in different types of simulators and virtual reality experiences (medical sector, automobile industry, aerospace industry, games, etc.), as well as in robot teleoperation tasks in different scenarios.
[0142] Figure 5 shows the different elements of the interaction system in virtual reality environments using a haptic desktop force feedback device, according to a possible embodiment of the present invention. In particular, the system 500 comprises a desktop haptic force feedback device 502 (shown in the figure incompletely and schematically) through which a user 212 interacts with a virtual reality environment; a graphic processing unit 504 (such as a GPU, a CPU, and in general any unit or electronic device with data processing capacity) in charge of generating virtual scenes 202 of the virtual reality environment in which a virtual avatar 206 of the haptic desktop force feedback device 502 is represented interacting with virtual objects 208 of the virtual environment; and a virtual reality viewer 204 for displaying to the user 212 the virtual scenes generated by the graphic processing unit 504.
[0144] According to the embodiment shown in Figure 5, the graphic processing unit 504 is a separate and independent entity from the virtual reality viewer 204. Said graphic processing unit 504 can be implemented, for example, by a computer connected to the virtual reality viewer 204 by a cable 506 (or via a high-frequency wireless connection according to recent technologies, e.g. WiGig at 60 GHz). However, in another embodiment (not shown in the figures) the graphic processing unit 504 is integrated into the virtual reality viewer 204 itself (i.e. a standalone virtual reality headset, such as the Oculus Quest).
[0146] The graphic processing unit 504 is configured to define, in a virtual reality environment, an action space 210 that corresponds to the workspace 510 of the haptic desktop force feedback device 502 in the real environment. The graphic processing unit 504 is configured to detect modification orders 512 and repositioning orders 514 of the action space 210. As explained above, said orders are generated in different ways by an action of the user 212, for example by pressing a button 508 of the desktop force feedback haptic device 502.
[0148] In the example shown in Figure 5, the desktop force feedback haptic device 502 is configured to send to the graphic processing unit 504, in response to a press of a button 508 by the user 212, a modification order 512 containing information to modify the size of the action space 210. When the graphic processing unit 504 detects the modification order 512, it increases or decreases the dimensions of the action space 210 by a certain amount based on the detected modification order 512 (enlargement or reduction). The enlargement or reduction of the action space 210 can be performed in a discrete manner (for example, a single 2x enlargement according to the information contained in the modification order 512) or continuously (for example, with a progressive enlargement rate applied for as long as the modification order 512 is being received).
[0150] The graphic processing unit 504 determines the location of the modified action space 210' to keep the position of the virtual avatar 206 unchanged (as explained in the example of Figure 2B), and maps the workspace 510 of the haptic desktop force feedback device 502 to the modified action space 210', matching them (for example, a displacement d on the X axis of the haptic device in the workspace 510 corresponds to a displacement d' on the corresponding axis within the action space 210 of the virtual reality environment). The graphic processing unit 504 sends, via cable 506, virtual scenes 202 to the virtual reality viewer 204 at a certain refresh rate (e.g. 90 Hz), showing the user 212 the modified action space 210'. Sending the virtual scenes 202 with the updated action space size is optional, since the action space 210 does not have to be displayed in the virtual scene 202.
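The two operations in this paragraph — relocating the scaled space so the avatar stays put, and mapping physical displacements into the resized space — can be sketched as follows (an illustrative reconstruction, not code from the patent; the function names are invented). If the avatar sits at `avatar = center + offset` and scaling by `factor` turns the offset into `factor * offset`, then keeping the avatar fixed requires moving the center to `avatar + factor * (center - avatar)`:

```python
def scale_about_avatar(center, avatar, factor):
    """Relocate the action-space center after its size is scaled by
    `factor`, so that the virtual avatar keeps its exact position:
    new_center = avatar + factor * (old_center - avatar)."""
    return tuple(a + factor * (c - a) for c, a in zip(center, avatar))

def map_displacement(d_physical, factor):
    """A displacement d on a haptic-device axis in the workspace 510
    maps to d' = factor * d on the corresponding action-space axis."""
    return factor * d_physical
```

For example, doubling the space with the avatar one unit to the right of the center moves the center one unit to the left, and the avatar's virtual position is unchanged.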
[0152] The graphics processing unit 504 is also configured to detect a repositioning command 514 of the action space 210, and to move the action space 210 to a new location in the virtual reality environment based on the repositioning command 514 detected.
[0154] The desktop haptic force feedback device 502 may be configured to send to the graphic processing unit 504, in response to, for example, the press of a button 508 by the user 212, a repositioning order 514 (shown in dashed lines) containing information used to move the action space 210 to a new location. For example, the repositioning order may include information to increase or decrease the distance from the action space 210 to the user 212 by a certain amount. Alternatively, the graphic processing unit 504 can increase or decrease the distance from the action space 210 to the user 212 for as long as it is receiving a repositioning order 514 (in the event that said order is being sent continuously while the user 212 is holding down a button 508).
[0156] The haptic desktop force feedback device 502 may be configured to detect a modification order 512 or a repositioning order 514, for example by sensing the press by the user 212 of a button 508 of the haptic desktop force feedback device 502. The haptic desktop force feedback device 502 sends the detected order, conveniently processed, via cable 516 or wirelessly to the graphic processing unit 504, which detects the order upon receipt.
[0158] The graphic processing unit 504 may detect a repositioning order 514 through information received from the haptic desktop force feedback device 502 or in other ways, for example by analyzing the information provided by the virtual reality viewer 204. Thus, for example, when the user 212 fixes their gaze (detected through the orientation of the virtual reality viewer 204) for a specified time on a repositioning area 518 of the virtual reality environment (for example, a repositioning area 518 represented in the virtual scene 202 close to the action space 210, as shown in Figure 5, or a repositioning area 518 that coincides with the action space 210 itself), the graphic processing unit 504 interprets it as a repositioning order 514 to move the action space 210 to a new location, and in particular to modify the angular position of the action space 210 with respect to the user 212. In this case, the detection of the repositioning order 514 of the action space 210 is started by orienting the virtual reality viewer 204 for a certain time towards the repositioning area 518 of the virtual environment, and is ended by holding the orientation of the virtual reality viewer 204 for a certain time in a final orientation 308. The modification of the angular position of the action space 210 is determined as a function of said final orientation 308 of the virtual reality viewer 204. In this embodiment, although the sending of a repositioning order 514 from the virtual reality viewer 204 to the graphic processing unit 504 is represented in dashed lines in Figure 5, in reality the virtual reality viewer 204 does not send a repositioning order 514 as such, but instead sends information regarding the orientation of the viewer which, when it fulfils one or more conditions, can be detected or interpreted as a repositioning order.
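The gaze-based detection just described — start when the gaze rests on the repositioning area for a fixed time, end when the final orientation is then held for a fixed time — behaves like a small state machine driven by per-frame orientation updates. A possible Python sketch (the class, its inputs and the dwell time are assumptions for illustration; the patent does not prescribe an implementation):

```python
class GazeDwellDetector:
    """Detects a repositioning order from viewer orientation alone:
    emits "start" after the gaze dwells on the repositioning area,
    then "end" after the gaze subsequently holds a final orientation."""

    def __init__(self, dwell_s=3.0):
        self.dwell_s = dwell_s   # required dwell time in seconds
        self.timer = 0.0
        self.active = False      # True while a repositioning is underway

    def update(self, on_repositioning_area, gaze_steady, dt):
        """Call once per frame; dt is the frame time in seconds."""
        if not self.active:
            # Accumulate dwell time only while looking at the area.
            self.timer = self.timer + dt if on_repositioning_area else 0.0
            if self.timer >= self.dwell_s:
                self.active, self.timer = True, 0.0
                return "start"
        else:
            # Accumulate dwell time only while the orientation is held.
            self.timer = self.timer + dt if gaze_steady else 0.0
            if self.timer >= self.dwell_s:
                self.active, self.timer = False, 0.0
                return "end"
        return None
```

On "end", the graphic processing unit would read the viewer's current orientation as the final orientation 308 and move the action space accordingly.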
[0160] Depending on the order received or the configuration of the graphic processing unit 504, the modification of the size of the action space 210 or of its distance to the user can be carried out in a discrete manner, so that each press of the button 508 of the desktop force feedback haptic device 502 scales the size/distance up or down by a certain preset value, or continuously, where holding down the button 508 progressively zooms in/out until the user releases the button, at which point the desired level of expansion is considered to have been reached.
[0162] The method 100 may comprise sending, by the haptic desktop force feedback device 502 to the graphic processing unit 504, one or more modification orders 512 of the action space 210, where each modification order 512 contains information used by the graphics processing unit 504 to perform the increase or decrease in the size of the action space 210.
[0164] Figure 6A illustrates the content of a modification order 512, which may include the modification type 602, either an increase in size or a decrease in size. The modification order 512 may also include a degree, level or coefficient of size modification 604 that the graphic processing unit 504 must apply, for example an enlargement coefficient or a reduction coefficient. Alternatively, the graphic processing unit 504 may have the level of size modification to apply upon receiving an increase or decrease order determined by a configurable parameter; in that case it is not necessary to include a size modification coefficient 604 in the modification order 512. In another embodiment the modification order only includes a size modification coefficient 604, which indicates per se whether it is an increase or decrease in size and the level of increase or decrease to be applied (e.g. a coefficient of 0.5 indicates a decrease to half the size and a coefficient of 2 indicates an increase to twice the size). These size modification coefficients 604 included in the modification order 512 may be a configurable parameter of the haptic desktop force feedback device 502.
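The single-coefficient encoding in the last embodiment can be shown in one line of Python (an illustrative sketch; the function name is invented, but the 0.5/2 semantics follow the example in the text):

```python
def apply_modification_order(size, coefficient):
    """Apply a size modification coefficient 604 to the action-space
    dimensions: direction and magnitude are encoded together, so
    0.5 halves the action space and 2.0 doubles it."""
    length, width, depth = size
    return (length * coefficient,
            width * coefficient,
            depth * coefficient)
```

Because the coefficient multiplies every dimension equally, the proportions inherited from the physical workspace are preserved through any sequence of modification orders.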
[0165] In one embodiment, the user presses a first button on the haptic desk force feedback device 502 to send an increase in size command, and presses a second button to send a decrease in size command. In this particular case, the modification order 512 is sent only once, with information related to the type of modification 602 assigned to the button pressed. The graphics processing unit 504 applies, once it receives the modification order 512, an increase or decrease ratio established in configurable internal parameters stored in memory.
[0167] In another embodiment, the user holds down a first button of the haptic desktop force feedback device 502 to progressively increase the size of the action space 210, and holds down a second button of the haptic desktop force feedback device 502 to progressively decrease the size of the action space 210. In this particular case, the modification order 512 is sent repeatedly during the time the button is pressed, with information related to the modification type 602 assigned to the pressed button. During the time that the graphic processing unit 504 receives the modification order 512, it progressively applies an increase or decrease rate determined according to configurable parameters stored in memory.
[0169] The method 100 may comprise sending, by the haptic desk force feedback device 502 to the graphic processing unit 504, one or more repositioning orders 514 of the action space 210, where each repositioning order 514 contains information used by the graphics processing unit 504 to perform the movement of the action space 210 to a new location, for example by indicating a certain offset of the action space 210 relative to the current position or by indicating the final position of the action space 210.
[0171] Figure 6B shows, according to one embodiment, the content of a repositioning order 514, which may include a modification of the angular position 606, a modification of the distance to the user 608, or a combination of both. The angular position and distance information can be provided in absolute terms or relative to the current position. In one embodiment, to determine the modification of the angular position 606, the process of detection 110 of a repositioning order 514 by the graphic processing unit 504 comprises detecting a repositioning start command and a repositioning end command. Through the repositioning start command, the graphic processing unit 504 detects the user's instruction to initiate a repositioning order, and in particular the modification of the angular position of the action space 210. Through the repositioning end command, the graphic processing unit 504 determines the modification of the angular position 606 as a function of a final orientation 308 of the virtual reality viewer 204 at the time of receiving said command.
[0173] In one embodiment, the repositioning start command and the repositioning end command may be the same command. For example, the user can double-press a button of the haptic desktop force feedback device 502, and said double press can determine both the start and the end of the detection 110 of a repositioning order 514, so that the modification of the angular position 606 of the action space 210 is determined based on the final orientation 308 of the virtual reality viewer 204 at the moment of the double press.
[0175] Finally, the method 100 may also comprise the step of representing, in the virtual reality viewer 204, at least one virtual scene 202 in which the modified action space 210' or the repositioned action space 210'' is shown, depending on the type of order detected and executed (i.e. modification order 512 or repositioning order 514). The virtual avatar 206 of the desktop force feedback haptic device 502 is also represented in the virtual scene 202, with its position suitably updated.
Claims (22)
[1]
1. Method of interaction in virtual reality environments using a desktop force feedback haptic device, where a virtual avatar (206) of a desktop force feedback haptic device (502) is represented in virtual scenes (202) by means of a virtual reality viewer (204) carried by a user (212), characterized in that the method (100) comprises:
defining (102) an action space (210) in a virtual reality environment, where said action space (210) corresponds to a workspace (510) of the haptic desktop force feedback device (502);
upon detection (104) of a modification order (512) of the action space (210):
modifying (106) the size of the action space (210), increasing or decreasing the dimensions of the action space (210) by a certain amount as a function of the modification order (512) detected;
determining (107) the location of the modified action space (210') to keep the position of the virtual avatar (206) unaltered; and
mapping (108) the workspace (510) of the haptic desktop force feedback device (502) to the modified action space (210');
upon detection (110) of a repositioning order (514) of the action space (210), moving (112) the action space (210) to a new location in the virtual reality environment as a function of the repositioning order (514) detected.
[2]
Method according to claim 1, characterized in that the repositioning order (514) of the action space (210) comprises:
a modification of the angular position (606) of the action space (210) with respect to the user (212);
a modification of the distance (608) of the action space (210) with respect to the user (212); or
a combination of the above.
[3]
3. Method according to claim 2, characterized in that the modification of the angular position (606) of the action space (210) is determined as a function of a final orientation (308) of the virtual reality viewer (204).
[4]
Method according to claim 3, characterized in that the detection (110) of a repositioning order (514) of the action space (210) is initiated by a repositioning start command, and is terminated by a repositioning end command generated when the virtual reality viewer (204) is oriented in the final orientation (308).
[5]
Method according to claim 4, characterized in that the repositioning start command is generated by orienting the virtual reality viewer (204) for a specified time towards a repositioning area (518) of the virtual environment, and the repositioning end command is generated by maintaining the orientation of the virtual reality viewer (204) for a certain time in the final orientation (308).
[6]
Method according to claim 4, characterized in that the repositioning start command is generated by a first interaction with at least one button of the haptic desktop force feedback device (502) and the repositioning end command is generated by a second interaction with at least one button of the haptic desktop force feedback device (502).
[7]
Method according to claim 4, characterized in that the relocation start command and the relocation end command are generated by voice commands.
[8]
Method according to any one of claims 2 to 7, characterized in that the modification of the distance of the action space (210) with respect to the user (212) is determined by pressing at least one button (508) of the haptic desktop force feedback device (502).
[9]
Method according to any of the preceding claims, characterized in that the modification of the size of the action space (210) is determined by pressing at least one button (508) of the haptic desktop force feedback device (502).
[10]
Method according to any of the preceding claims, characterized in that it comprises sending, by the haptic desktop force feedback device (502) to the graphic processing unit (504), at least one modification order (512) of the action space (210), where each modification order (512) contains information (602, 604) used by the graphic processing unit (504) to perform the increase or decrease in the size of the action space (210).
[11]
Method according to any of the preceding claims, characterized in that it comprises sending, by the haptic desktop force feedback device (502) to the graphic processing unit (504), at least one repositioning order (514) of the action space (210), where each repositioning order (514) contains information (606, 608) used by the graphic processing unit (504) to move the action space (210) to a new location.
[12]
12. Interaction system in virtual reality environments using a haptic desktop force feedback device, comprising:
a desktop force feedback haptic device (502) for interacting with a virtual reality environment;
a graphic processing unit (504) in charge of generating virtual scenes (202) of the virtual reality environment in which a virtual avatar (206) of the haptic desktop force feedback device (502) is represented;
a virtual reality viewer (204) for displaying the generated virtual scenes (202) to a user (212);
characterized in that the graphic processing unit (504) is configured to:
defining an action space (210) in the virtual reality environment, where said action space (210) corresponds to a workspace (510) of the haptic desktop force feedback device (502);
upon detection of a modification order (512) of the action space (210): modify the size of the action space (210), increasing or decreasing the dimensions of the action space (210) by a certain amount depending on the modification order (512) detected;
determining the location of the modified action space (210') to keep the position of the virtual avatar (206) unaltered; and
mapping the workspace (510) of the haptic desktop force feedback device (502) to the modified action space (210');
upon detection of a repositioning order (514) of the action space (210), moving the action space (210) to a new location in the virtual reality environment as a function of the repositioning order (514) detected.
[13]
13. System according to claim 12, characterized in that the repositioning order (514) of the action space (210) comprises:
a modification of the angular position (606) of the action space (210) with respect to the user (212);
a modification of the distance (608) of the action space (210) with respect to the user (212); or
a combination of the above.
[14]
System according to claim 13, characterized in that the modification of the angular position (606) of the action space (210) is determined as a function of a final orientation (308) of the virtual reality viewer (204).
[15]
System according to claim 14, characterized in that the detection of a repositioning order (514) of the action space (210) is initiated by a repositioning start command, and is terminated by a repositioning end command generated when the virtual reality viewer (204) is oriented in the final orientation (308).
[16]
16. System according to claim 15, characterized in that the repositioning start command is generated by orienting the virtual reality viewer (204) for a certain time towards a repositioning area (518) of the virtual environment, and the repositioning end command is generated by maintaining the orientation of the virtual reality viewer (204) for a certain time in the final orientation (308).
[17]
System according to claim 15, characterized in that the repositioning start command is generated by a first interaction with at least one button of the haptic desktop force feedback device (502) and the repositioning end command is generated by a second interaction with at least one button of the haptic desktop force feedback device (502).
[18]
System according to claim 15, characterized in that the repositioning start command and the repositioning end command are generated by voice commands.
[19]
System according to any of claims 13 to 18, characterized in that the modification of the distance (608) of the action space (210) with respect to the user (212) is determined by pressing at least one button (508) of the haptic desktop force feedback device (502).
[20]
System according to any one of claims 12 to 19, characterized in that the modification of the size of the action space (210) is determined by pressing at least one button (508) of the haptic desktop force feedback device (502).
[21]
21. A program product comprising program instruction means for carrying out the method defined in any one of claims 1 to 11 when the program is executed on a processor.
[22]
22. A program product according to claim 21, stored on a program support medium.